On the Validity of the Pairs Bootstrap for Lasso Estimators
Abstract
We study the validity of the pairs bootstrap for Lasso estimators in linear regression models with random covariates and heteroscedastic error terms. We show that the naive pairs bootstrap does not consistently estimate the distribution of the Lasso estimator. In particular, we identify two distinct sources of this failure. First, in the bootstrap samples the Lasso estimator fails to correctly mimic the population moment condition satisfied by the regression parameter. Second, the bootstrap Lasso estimation criterion does not reproduce the sign of the zero coefficients with sufficient accuracy. To overcome these problems, we introduce a modified pairs bootstrap procedure that consistently estimates the distribution of the Lasso estimator. Finally, we also consider the adaptive Lasso estimator and show that the modified pairs bootstrap consistently estimates its distribution as well. Monte Carlo simulations confirm the good finite-sample accuracy of the modified pairs bootstrap procedure. These results show that, when properly defined, the pairs bootstrap can provide a valid approach for estimating the distribution of Lasso estimators.
AMS (2000) Subject Classification: Primary: 62J07; Secondary: 62G09, 62E20.
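To make the resampling scheme concrete, the following minimal sketch illustrates the naive pairs bootstrap for the Lasso: observation pairs (x_i, y_i) are drawn with replacement and the Lasso is refitted on each bootstrap sample. This is only an illustration of the naive scheme discussed in the abstract, not the modified procedure proposed in the paper; the function name, the fixed penalty level, and the simulated heteroscedastic design are illustrative assumptions.

# Illustrative sketch of the naive pairs bootstrap for the Lasso (assumed
# names and settings; not the modified procedure proposed in the paper).
import numpy as np
from sklearn.linear_model import Lasso

def naive_pairs_bootstrap_lasso(X, y, lam, n_boot=500, seed=None):
    """Resample (x_i, y_i) pairs with replacement and refit the Lasso."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    draws = np.empty((n_boot, p))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)   # draw n pairs with replacement
        fit = Lasso(alpha=lam).fit(X[idx], y[idx])
        draws[b] = fit.coef_               # bootstrap draw of the Lasso estimate
    return draws

# Toy data with heteroscedastic errors, mirroring the setting of the abstract.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.0, 0.0, 0.0])
y = X @ beta + (0.5 + np.abs(X[:, 0])) * rng.normal(size=n)
draws = naive_pairs_bootstrap_lasso(X, y, lam=0.1, n_boot=200, seed=1)
print(draws.std(axis=0))  # naive bootstrap spread of each coefficient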
Similar works
Differenced-Based Double Shrinking in Partial Linear Models
The partial linear model is very flexible, allowing the relation between the covariates and the responses to be either parametric or nonparametric. However, estimating the regression coefficients is challenging, since the nonparametric component must be estimated simultaneously. As a remedy, the differencing approach, which eliminates the nonparametric component before the regression coefficients are estimated, can ...
Bootstrap-based Penalty Choice for the Lasso, Achieving Oracle Performance
In theory, if penalty parameters are chosen appropriately then the lasso can eliminate unnecessary variables in prediction problems, and improve the performance of predictors based on the variables that remain. However, standard methods for tuning-parameter choice, for example techniques based on the bootstrap or cross-validation, are not sufficiently accurate to achieve this level of precision...
Model Selection, Estimation, and Bootstrap Smoothing
Classical statistical theory ignores model selection in assessing estimation accuracy. Here we consider bootstrap methods for computing standard errors and confidence intervals that take model selection into account. The methodology involves bootstrap smoothing, also known as bagging, to tame the erratic discontinuities of selection-based estimators. A projection theorem then provides standard ...
Asymptotic Properties of the Residual Bootstrap for Lasso Estimators
In this article, we derive the asymptotic distribution of the bootstrapped Lasso estimator of the regression parameter in a multiple linear regression model. It is shown that under some mild regularity conditions on the design vectors and the regularization parameter, the bootstrap approximation converges weakly to a random measure. The convergence result rigorously establishes a prev...
On the Second Order Behaviour of the Bootstrap of L_1 Regression Estimators
We consider the second-order asymptotic properties of the bootstrap of L_1 regression estimators by looking at the difference between the L_1 estimator and its first-order approximation, where the latter is the minimizer of a quadratic approximation to the L_1 objective function. It is shown that the bootstrap distribution of the normed difference does not converge (eit...